
    Global sensitivity analysis of computer models with functional inputs

    Global sensitivity analysis is used to quantify the influence of uncertain input parameters on the response variability of a numerical model. The common quantitative methods apply to computer codes with scalar input variables. This paper illustrates different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some input variables are functional, such as stochastic processes or random spatial fields. We focus on CPU-time-expensive computer codes which require a preliminary metamodeling step before the sensitivity analysis can be performed. We propose a joint modeling approach, i.e., simultaneously modeling the mean and the dispersion of the code outputs with two interlinked Generalized Linear Models (GLMs) or Generalized Additive Models (GAMs). The "mean" model yields the sensitivity indices of each scalar input variable, while the "dispersion" model yields the total sensitivity index of the functional input variables. The proposed approach is compared to classical sensitivity analysis methodologies on an analytical function. Lastly, the methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
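As a concrete point of reference for the variance-based indices discussed above, here is a minimal Monte Carlo ("pick-freeze") estimator of first-order Sobol' indices for scalar inputs. This is a textbook scheme, not the paper's joint GLM/GAM approach, and the additive test function is an arbitrary toy example:

```python
import random

def sobol_first_order(f, d, n=10000, seed=0):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol' indices
    of f, with d independent U(0, 1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(d)] for _ in range(n)]
    B = [[rng.random() for _ in range(d)] for _ in range(n)]
    yA = [f(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    S = []
    for i in range(d):
        # freeze column i from A, redraw the other columns from B
        yC = [f([A[k][j] if j == i else B[k][j] for j in range(d)])
              for k in range(n)]
        S.append((sum(a * c for a, c in zip(yA, yC)) / n - mean ** 2) / var)
    return S

# toy additive model x1 + 2*x2: exact first-order indices are 0.2 and 0.8
S = sobol_first_order(lambda x: x[0] + 2.0 * x[1], 2)
```

For an additive model the first-order indices sum to one, which gives a quick sanity check on the estimates.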

    Uncertainty and sensitivity analysis of functional risk curves based on Gaussian processes

    A functional risk curve gives the probability of an undesirable event as a function of the value of a critical parameter of the physical system under study. In many applied settings, this curve is built from phenomenological numerical models which simulate complex physical phenomena. To avoid CPU-time-expensive runs of these models, we propose to use Gaussian process regression to build functional risk curves, and we give an algorithm that provides confidence bounds accounting for this approximation. Two methods for the global sensitivity analysis of the models' random input parameters with respect to the functional risk curve are also studied. In particular, the PLI sensitivity indices make it possible to assess the effect of a misspecification of the input parameters' probability density functions.
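The object being approximated can be illustrated with a brute-force sketch: the probability of exceeding a threshold, as a function of a critical parameter `a`. The limit-state function `g` below is a cheap hypothetical stand-in for the expensive phenomenological model that the Gaussian-process surrogate would replace:

```python
import random

def risk_curve(g, a_grid, n=5000, seed=0):
    """Monte Carlo functional risk curve: P(g(X, a) > 0) as a function of
    the critical parameter a, with X ~ N(0, 1).  The same input sample is
    reused for every a, so the estimated curve is internally consistent."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(1 for x in xs if g(x, a) > 0.0) / n for a in a_grid]

# hypothetical limit state: failure when x + a exceeds 2,
# so the curve is P(X > 2 - a), increasing in a
curve = risk_curve(lambda x, a: x + a - 2.0, [0.0, 1.0, 2.0, 3.0])
```

Because the same sample is reused across the grid, the exceedance events are nested and the estimated curve is exactly monotone, not just monotone on average.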

    Global Sensitivity Analysis of Stochastic Computer Models with joint metamodels

    Global sensitivity analysis, used to quantify the influence of uncertain input variables on the response variability of a numerical model, is classically applicable to deterministic computer codes (for which the same set of input variables always gives the same output value). This paper proposes a global sensitivity analysis methodology for stochastic computer codes (whose output varies because of some uncontrollable variables), using the framework of joint modeling of the mean and dispersion of heteroscedastic data. To cope with the complexity of computer-experiment outputs, nonparametric joint models based on Generalized Additive Models and Gaussian processes are discussed. The relevance of these new models is assessed in terms of the resulting variance-based sensitivity indices on two case studies. Results show that the joint modeling approach leads to accurate sensitivity index estimates even in the presence of clear heteroscedasticity.
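For intuition, the quantity the "dispersion" model targets can be estimated by brute-force replication on a toy stochastic simulator: the total sensitivity index of the uncontrollable randomness is E[Var(Y|X)] / Var(Y). This replication approach is feasible here only because the toy code is cheap; the paper's joint metamodels are there to avoid exactly this cost:

```python
import random
import statistics

def noise_total_index(sim, n_x=2000, reps=30, seed=0):
    """Replication-based estimate of the total Sobol' index of the
    uncontrollable randomness in a stochastic simulator sim(x, rng):
    average conditional variance over total variance."""
    rng = random.Random(seed)
    all_y, cond_vars = [], []
    for _ in range(n_x):
        x = rng.random()                      # controllable input, U(0, 1)
        ys = [sim(x, rng) for _ in range(reps)]
        all_y.extend(ys)
        cond_vars.append(statistics.variance(ys))   # unbiased Var(Y | X = x)
    return statistics.fmean(cond_vars) / statistics.pvariance(all_y)

# toy simulator y = x + eps, eps ~ N(0, 0.5):
# Var(Y) = 1/12 + 1/4, so the exact noise index is (1/4)/(1/3) = 0.75
idx = noise_total_index(lambda x, rng: x + rng.gauss(0.0, 0.5))
```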

    Latin hypercube sampling with inequality constraints

    In studies requiring predictive, CPU-time-consuming numerical models, the sampling design of the model input variables has to be chosen with care. For this purpose, Latin hypercube sampling has a long history and has proven robust. In this paper we propose and discuss a new algorithm for building a Latin hypercube sample (LHS) that takes inequality constraints between the sampled variables into account. This technique, called constrained Latin hypercube sampling (cLHS), consists in permuting an initial LHS so as to honor the desired monotonic constraints. The relevance of the approach is shown on a real example from numerical welding simulation, where the inequality constraints arise because some material properties physically decrease with temperature.
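A toy sketch of the idea, under simplifying assumptions: two variables on different ranges, the constraint x1 ≤ x2 enforced by permuting the second column (largest x1 served first), with a restart if the greedy permutation gets stuck. This illustrates the permutation principle only, not the paper's cLHS algorithm:

```python
import random

def lhs_column(n, lo, hi, rng):
    """One LHS marginal: one point per equal-width stratum, shuffled."""
    vals = [lo + (k + rng.random()) * (hi - lo) / n for k in range(n)]
    rng.shuffle(vals)
    return vals

def constrained_lhs(n, seed=0):
    """Toy constrained LHS on [0,1] x [0,2] honouring x1 <= x2.
    The x2 column is re-paired greedily; each marginal stays an LHS
    because only the pairing (a permutation) changes."""
    rng = random.Random(seed)
    while True:
        x1 = lhs_column(n, 0.0, 1.0, rng)
        x2 = lhs_column(n, 0.0, 2.0, rng)
        unused, paired, ok = sorted(x2), [0.0] * n, True
        for i in sorted(range(n), key=lambda i: -x1[i]):
            cand = [v for v in unused if v >= x1[i]]
            if not cand:          # greedy matching failed: resample
                ok = False
                break
            v = rng.choice(cand)  # random feasible value keeps some spread
            unused.remove(v)
            paired[i] = v
        if ok:
            return x1, paired
```

Since the constraint is satisfied by permuting values rather than rejecting them, each column keeps exactly one point per stratum, i.e. the LHS property is preserved.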

    "Optimal" clustering in functional spaces

    Computer codes used in support of the nuclear industry are increasingly complex, and consequently increasingly CPU-time-consuming. We consider such codes in the special case of functional outputs: the code output represents the evolution of some physical parameters over time. These output curves are functions from an interval I ⊂ ℝ to ℝ, which are preprocessed in order to group them into a few meaningful clusters (clustering, or unsupervised classification). The aim of our work is to estimate the convergence speed of clustering error estimates. After deriving bounds on these convergence speeds, we illustrate them on an example with six distinct groups of curves.
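A minimal stand-in for the clustering step: plain k-means on discretized curves with farthest-point initialization. The paper analyzes the theoretical error rates of such schemes; this particular implementation is only illustrative:

```python
import math

def kmeans_curves(curves, k, iters=20):
    """k-means on curves discretised on a common grid (plain vectors),
    with deterministic farthest-point initialisation."""
    centers = [curves[0]]
    while len(centers) < k:   # next seed: curve farthest from current seeds
        centers.append(max(curves,
                           key=lambda c: min(math.dist(c, m) for m in centers)))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for c in curves:      # assign each curve to its nearest centre
            groups[min(range(k), key=lambda j: math.dist(c, centers[j]))].append(c)
        centers = [[sum(v) / len(g) for v in zip(*g)] if g else centers[j]
                   for j, g in enumerate(groups)]   # recompute mean curves
    return centers, groups
```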

    Derivative-based global sensitivity measures: general links with Sobol' indices and numerical tests

    Estimating the variance-based importance measures (called Sobol' indices) of the input variables of a numerical model can require a large number of model evaluations, which becomes impractical for high-dimensional models involving many input variables (typically more than ten). Recently, Sobol and Kucherenko proposed the Derivative-based Global Sensitivity Measures (DGSM), defined as the integral of the squared derivatives of the model output, and showed that they can help overcome this dimensionality problem in some cases. We provide a general inequality linking DGSM and total Sobol' indices for input variables belonging to the class of Boltzmann probability measures, thus extending the previous results of Sobol and Kucherenko for uniform and normal measures. The special case of log-concave measures is also described. This link provides a DGSM-based upper bound on the total Sobol' indices. Numerical tests show the performance of the bound and its usefulness in practice.
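The bound can be checked numerically on a toy case. The sketch below estimates the DGSM ν_i = E[(∂f/∂x_i)²] by finite differences for a simple additive function with uniform inputs, where the total Sobol' indices (0.2 and 0.8) and the Sobol-Kucherenko bound ν_i / (π² Var Y) are known in closed form:

```python
import math
import random

def dgsm(f, d, n=4000, h=1e-5, seed=1):
    """Monte Carlo estimate of nu_i = E[(df/dx_i)^2] for U(0,1)^d inputs,
    using central finite differences."""
    rng = random.Random(seed)
    nu = [0.0] * d
    for _ in range(n):
        # sample slightly inside the cube so x +/- h stays in the domain
        x = [0.01 + 0.98 * rng.random() for _ in range(d)]
        for i in range(d):
            xp, xm = x[:], x[:]
            xp[i] += h
            xm[i] -= h
            g = (f(xp) - f(xm)) / (2 * h)
            nu[i] += g * g
    return [v / n for v in nu]

# additive test model f(x) = x1 + 2*x2: nu = [1, 4], Var(Y) = 5/12
f = lambda x: x[0] + 2.0 * x[1]
nu = dgsm(f, 2)
var_y = 5.0 / 12.0
# Sobol-Kucherenko bound for uniform inputs: S_i^tot <= nu_i / (pi^2 * Var Y)
bounds = [v / (math.pi ** 2 * var_y) for v in nu]
# the exact total indices are 0.2 and 0.8, so both bounds should dominate them
```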

    Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations

    To perform uncertainty, sensitivity or optimization analysis on scalar variables computed by a CPU-time-expensive computer code, a widely accepted methodology consists in first identifying the most influential uncertain inputs (by screening techniques), and then replacing the expensive model by an inexpensive mathematical function, called a metamodel. This paper extends this methodology to the functional-output case, for instance when the model outputs are curves. The screening approach is based on analysis of variance and principal component analysis of the output curves. The functional metamodeling consists of a curve classification step, a dimension reduction step, and then a classical metamodeling step. An industrial nuclear reactor application (dealing with uncertainties in pressurized thermal shock analysis) illustrates all these steps.
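The dimension-reduction step can be sketched with a bare-bones principal component analysis of the output curves, computed by power iteration. This assumes the curves are discretized on a common grid, and is a generic PCA, not the paper's full screening/metamodeling pipeline:

```python
import math
import random

def first_pc(curves, iters=50, seed=0):
    """Leading principal component of centred output curves via power
    iteration, returning the mean curve, the component, and the scores
    (the scalar coordinates one would then metamodel)."""
    rng = random.Random(seed)
    n, m = len(curves), len(curves[0])
    mean = [sum(c[t] for c in curves) / n for t in range(m)]
    X = [[c[t] - mean[t] for t in range(m)] for c in curves]
    v = [rng.gauss(0.0, 1.0) for _ in range(m)]   # random start direction
    for _ in range(iters):
        s = [sum(x[t] * v[t] for t in range(m)) for x in X]           # X v
        w = [sum(s[i] * X[i][t] for i in range(n)) for t in range(m)] # X^T X v
        norm = math.sqrt(sum(wt * wt for wt in w))
        v = [wt / norm for wt in w]
    scores = [sum(x[t] * v[t] for t in range(m)) for x in X]
    return mean, v, scores
```

On a one-parameter family of curves the centred data is rank one, so the first component reconstructs every curve exactly and the scores vary monotonically with the driving parameter.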

    Numerical studies of space filling designs: optimization of Latin Hypercube Samples and subprojection properties

    Quantitative assessment of the uncertainties affecting the results of computer simulations is nowadays a major topic of interest in both industrial and scientific communities. One of the key issues in such studies is obtaining information about the output when the numerical simulations are expensive to run. This paper considers the problem of exploring the whole space of variations of the computer model input variables when the exploration space is high-dimensional. Various space-filling criteria are discussed: interpoint distance, discrepancy, and minimum-spanning-tree criteria. A specific class of designs, optimized Latin Hypercube Samples, is considered. Several optimization algorithms from the literature are compared in terms of convergence speed, robustness to subprojection, and the space-filling properties of the resulting designs. Some recommendations for building such designs are given. Finally, another contribution of this paper is an in-depth analysis of the space-filling properties of the designs' 2D subprojections.
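To make the maximin (interpoint-distance) criterion concrete, here is a toy comparison in which plain random restarts stand in for the simulated-annealing and exchange algorithms studied in the paper:

```python
import math
import random

def lhs(n, d, rng):
    """Random LHS on [0,1]^d: one point per stratum in each column."""
    cols = []
    for _ in range(d):
        c = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(c)
        cols.append(c)
    return list(zip(*cols))

def mindist(design):
    """Maximin criterion: smallest interpoint distance (larger is better)."""
    return min(math.dist(p, q)
               for i, p in enumerate(design) for q in design[i + 1:])

def best_lhs(n, d, tries=200, seed=0):
    """Crude optimisation by random restarts: keep the candidate LHS
    with the largest minimum interpoint distance."""
    rng = random.Random(seed)
    return max((lhs(n, d, rng) for _ in range(tries)), key=mindist)
```

Random restarts only show the criterion at work; the algorithms compared in the paper search the permutation space far more efficiently.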